GENERALIZED SECOND DERIVATIVES OF CONVEX FUNCTIONS AND SADDLE FUNCTIONS

Accepted for publication in the Transactions of the American Math. Society (1990)

Author

  • R. T. Rockafellar
Abstract

The theory of second-order epi-derivatives of extended-real-valued functions is applied to convex functions on ℝⁿ and shown to be closely tied to proto-differentiation of the corresponding subgradient multifunctions, as well as to second-order epi-differentiation of conjugate functions. An extension is then made to saddle functions, which by definition are convex in one argument and concave in another. For this case a concept of epi-hypo-differentiability is introduced. The saddle function results provide a foundation for the sensitivity analysis of primal and dual optimal solutions to general finite-dimensional problems in convex optimization, since such solutions are characterized as saddle points of a convex-concave Lagrangian function, or equivalently as subgradients of the saddle function conjugate to the Lagrangian.
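For orientation, here is a brief sketch, in LaTeX, of the standard definitions and relations the abstract alludes to; the notation (f''_{x,v}, epi-lim, and so on) is the conventional one, and the precise hypotheses and statements are those given in the paper itself.

% Second-order difference quotients of a convex function f at x, for a subgradient v in \partial f(x):
\[
  \Delta_t^2 f(x \mid v)(w) \;=\; \frac{f(x + t w) - f(x) - t \langle v, w \rangle}{\tfrac{1}{2} t^2},
  \qquad t > 0 .
\]
% f is twice epi-differentiable at x for v when these quotients epi-converge as t decreases to 0;
% the epi-limit is the second-order epi-derivative:
\[
  f''_{x,v} \;=\; \operatorname*{epi\text{-}lim}_{t \,\downarrow\, 0} \, \Delta_t^2 f(x \mid v) .
\]
% The ties described in the abstract, for a closed proper convex f on R^n:
% (i) f is twice epi-differentiable at x for v if and only if the subgradient multifunction
%     \partial f is proto-differentiable at x for v, in which case
\[
  (\partial f)'_{x,v} \;=\; \partial\bigl( \tfrac{1}{2} f''_{x,v} \bigr) ;
\]
% (ii) this holds if and only if the conjugate f^* is twice epi-differentiable at v for x, with
\[
  \tfrac{1}{2}\,(f^{*})''_{v,x} \;=\; \bigl( \tfrac{1}{2} f''_{x,v} \bigr)^{*} .
\]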


Similar Articles

Efficient approaches for escaping higher order saddle points in non-convex optimization

Local search heuristics for non-convex optimizations are popular in applied machine learning. However, in general it is hard to guarantee that such algorithms even converge to a local minimum, due to the existence of complicated saddle point structures in high dimensions. Many functions have degenerate saddle points such that the first and second order derivatives cannot distinguish them with l...


Saddle Point and Second Order Optimality in Nondifferentiable Nonlinear Abstract Multiobjective Optimization

This article deals with a vector optimization problem with cone constraints in a Banach space setting. By making use of a real-valued Lagrangian and the concept of generalized subconvex-like functions, weakly efficient solutions are characterized through saddle point type conditions. The results, jointly with the notion of generalized Hessian (introduced in [Cominetti, R., Correa, R.: A general...


On the convergence of conditional epsilon-subgradient methods for convex programs and convex-concave saddle-point problems

The paper provides two contributions. First, we present new convergence results for conditional ε-subgradient algorithms for general convex programs. The results obtained here extend the classical ones by Polyak [Sov. Math. Doklady 8 (1967) 593; USSR Comput. Math. Math. Phys. 9 (1969) 14; Introduction to Optimization, Optimization Software, New York, 1987] as well as the recent ones in [Math. P...


Accuracy Certificates for Computational Problems with Convex Structure

The goal of this paper is to introduce the notion of certificates, which verify the accuracy of solutions of computational problems with convex structure. Such problems include minimizing convex functions, variational inequalities with monotone operators, computing saddle points of convex-concave functions, and solving convex Nash equilibrium problems. We demonstrate how the implementation of t...


Conjugacy Correspondences: A Unified View

As preparation for a duality theory for saddle programs, a partial conjugacy correspondence is developed among equivalence classes of saddle functions. Three known conjugacy correspondences, including Fenchel's correspondence among convex functions and Rockafellar's extension of it to equivalence classes of saddle functions, are shown to be degenerate special cases. Additionally, two new corres...



Journal:

Volume   Issue

Pages  -

Publication date: 2007